Optimal Weight Decay in a Perceptron
Abstract
Weight decay was proposed to reduce overfitting, which often appears in the learning tasks of artificial neural networks. In this paper weight decay is applied to a well-defined model system based on a single-layer perceptron, which exhibits strong overfitting. Since the optimal non-overfitting solution is known for this system, we can compare the effect of weight decay with this solution. A strategy to find the optimal weight-decay strength is proposed, which leads to the optimal solution for any number of examples.
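As a concrete illustration of the setting, the following is a minimal sketch of training a single-layer perceptron with weight decay by online gradient descent on the squared error. The function name and hyperparameters (`lam` for the decay strength, `lr`, `epochs`) are illustrative, not taken from the paper.

```python
# Hypothetical sketch: single-layer perceptron trained by online gradient
# descent on the squared error, with a weight-decay (shrinkage) term of
# strength `lam` -- the quantity whose optimal value the paper studies.
def train_perceptron(examples, lam=0.01, lr=0.1, epochs=50):
    """examples: list of (x, y) pairs; x is a list of floats, y in {-1, +1}."""
    n = len(examples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in examples:
            out = sum(wi * xi for wi, xi in zip(w, x))
            err = y - out
            # gradient step on the squared error, plus weight-decay shrinkage
            w = [wi + lr * (err * xi - lam * wi) for wi, xi in zip(w, x)]
    return w
```

Larger values of `lam` shrink the weights more strongly; the paper's question is how to choose this strength so that the learned weights match the known optimal non-overfitting solution.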
Similar Resources
Optimal regularization of linear and nonlinear perceptrons
We derive an analytical formula for the generalization error of linear adaptive classifiers trained with weight decay. Analytical and experimental results are then presented to analyze the optimal value of regularization parameters as a function of the training set size.
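The empirical counterpart of choosing an optimal regularization strength can be sketched as a grid search over candidate decay values, scored on held-out data. The closed-form one-weight ridge fit and the function names below are illustrative assumptions, not the paper's analytical formula.

```python
# Hypothetical sketch: pick the weight-decay strength by held-out error.
def ridge_1d(pairs, lam):
    """Closed-form ridge fit for a one-weight linear model y ~ w * x."""
    sxy = sum(x * y for x, y in pairs)
    sxx = sum(x * x for x, y in pairs)
    return sxy / (sxx + lam)

def best_lambda(train, valid, candidates=(0.0, 0.01, 0.1, 1.0, 10.0)):
    """Return the candidate decay strength with lowest validation error."""
    def err(w, data):
        return sum((y - w * x) ** 2 for x, y in data) / len(data)
    return min(candidates, key=lambda lam: err(ridge_1d(train, lam), valid))
```

In the analytical treatment above, this grid search is replaced by a formula giving the optimal strength directly as a function of the training set size.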
An Efficient Method for Pruning the Multilayer Perceptron Based on the Correlation of Errors
In this paper we present a novel method for pruning redundant weights of a trained multilayer perceptron (MLP). The proposed method is based on the correlation analysis of the errors produced by the output neurons and the backpropagated errors associated with the hidden neurons. Repeated application of it eventually leads to the complete elimination of all connections of a neuron. Simulations ...
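The pruning criterion described above can be sketched as follows: hidden units whose backpropagated error correlates weakly with the output error are marked as redundant. The function names and the `threshold` parameter are illustrative assumptions, not the paper's exact procedure.

```python
import math

# Hypothetical sketch of correlation-based pruning: plain Pearson
# correlation between error series, then a keep/prune mask per hidden unit.
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def prune_mask(output_errors, hidden_errors_per_unit, threshold=0.1):
    """Return True (keep) / False (prune) for each hidden unit."""
    return [abs(pearson(output_errors, h)) >= threshold
            for h in hidden_errors_per_unit]
```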
متن کاملConvergence analysis of on-line weight noise injection training algorithms for MLP networks
Injecting weight noise during training has been proposed for almost two decades as a simple technique to improve the fault tolerance and generalization of a multilayer perceptron (MLP). However, little has been done regarding its convergence behavior. Therefore, we present in this paper convergence proofs for two of these algorithms for MLPs. One is based on combining injecting multiplicativ...
متن کاملOptimal information storage and the distribution of synaptic weights: Perceptron vs. Purkinje cell Supplemental Material: Computation of perceptron capacity and synaptic weight distribution
SNIWD: Simultaneous Weight Noise Injection with Weight Decay for MLP Training
Although injecting noise during training has been demonstrated to succeed in enhancing the fault tolerance of a neural network, theoretical analysis of the dynamics of such noise-injection-based online learning algorithms is far from complete. In particular, convergence proofs for those algorithms have not been shown. In this regard, this paper presents an empirical study on the non-converge...
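A single update step of the combined scheme (weight noise injection together with weight decay) can be sketched as below: the forward pass uses weights perturbed by Gaussian noise, while the decayed update is applied to the clean weights. The function name and hyperparameters are illustrative assumptions, not the algorithm analyzed in the paper.

```python
import random

# Hypothetical sketch of one SNIWD-style online step: Gaussian weight noise
# in the forward pass, gradient step plus weight decay on the clean weights.
def sniwd_step(w, x, y, lr=0.05, lam=0.01, sigma=0.1, rng=random):
    noisy_w = [wi + rng.gauss(0.0, sigma) for wi in w]      # noise injection
    out = sum(wi * xi for wi, xi in zip(noisy_w, x))        # noisy forward pass
    err = y - out
    # update the *clean* weights with gradient plus decay
    return [wi + lr * (err * xi - lam * wi) for wi, xi in zip(w, x)]
```

Over many steps the noise averages out and the decayed weights drift toward the target, which is the kind of convergence behavior the study examines empirically.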
Publication date: 1996